Decision stump : Wikipedia, English edition
Decision stump

A decision stump is a machine learning model consisting of a one-level decision tree. That is, it is a decision tree with one internal node (the root) immediately connected to the terminal nodes (its leaves). A decision stump makes a prediction based on the value of just a single input feature. Decision stumps are sometimes also called 1-rules.
Depending on the type of the input feature, several variations are possible. For nominal features, one may build a stump with a leaf for each possible feature value〔This classifier is implemented in Weka under the name OneR (for "1-rule").〕 or a stump with two leaves, one corresponding to some chosen category and the other to all remaining categories.〔This is what is implemented in Weka's DecisionStump classifier.〕 For binary features these two schemes are identical. A missing value may be treated as yet another category.
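The first scheme, a leaf per category predicting that category's majority label, can be sketched as follows. This is an illustrative sketch in the OneR spirit, not Weka's actual implementation; the function names are made up, and unseen (or missing) values are sent to an overall-majority default leaf, mirroring the "missing value as its own category" treatment.

```python
from collections import Counter, defaultdict

# OneR-style stump for a single nominal feature: one leaf per category,
# each predicting the majority label among training rows in that category.
# Names (fit_nominal_stump, predict_nominal) are illustrative.

def fit_nominal_stump(values, labels):
    by_cat = defaultdict(list)
    for v, y in zip(values, labels):
        by_cat[v].append(y)
    # Each leaf predicts its category's majority label.
    leaves = {v: Counter(ys).most_common(1)[0][0] for v, ys in by_cat.items()}
    # Unseen values fall through to the overall majority label.
    default = Counter(labels).most_common(1)[0][0]
    return leaves, default

def predict_nominal(stump, v):
    leaves, default = stump
    return leaves.get(v, default)
```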
For continuous features, a threshold value is usually selected, and the stump contains two leaves: one for values below the threshold and one for values above it. More rarely, multiple thresholds may be chosen, in which case the stump contains three or more leaves.
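A minimal sketch of such a two-leaf threshold stump, found by exhaustive search over features and candidate thresholds, might look like this (the names and the brute-force search are illustrative assumptions, not any library's API):

```python
# A minimal threshold decision stump: one test "feature f <= t" at the root,
# with a class label on each of the two leaves.

def fit_stump(X, y):
    """Search every feature and every observed value as a threshold;
    return the (feature, threshold, left_label, right_label) tuple
    with the fewest training errors. Assumes labels in {0, 1}."""
    n_features = len(X[0])
    best, best_errors = None, len(y) + 1
    for f in range(n_features):
        for t in sorted(set(row[f] for row in X)):
            for left, right in ((0, 1), (1, 0)):
                errors = sum(
                    1 for row, label in zip(X, y)
                    if (left if row[f] <= t else right) != label
                )
                if errors < best_errors:
                    best_errors, best = errors, (f, t, left, right)
    return best

def predict_stump(stump, row):
    f, t, left, right = stump
    return left if row[f] <= t else right
```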
Decision stumps are often〔Reyzin, Lev; and Schapire, Robert E. (2006); ''How Boosting the Margin Can Also Boost Classifier Complexity'', in ''ICML '06: Proceedings of the 23rd International Conference on Machine Learning'', pp. 753–760〕 used as components (called "weak learners" or "base learners") in machine learning ensemble techniques such as bagging and boosting. For example, the Viola–Jones face-detection algorithm employs AdaBoost with decision stumps as weak learners.〔Viola, Paul; and Jones, Michael J. (2004); ''Robust Real-Time Face Detection'', International Journal of Computer Vision, 57(2), 137–154〕
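To illustrate stumps as weak learners, a compact hand-rolled AdaBoost over threshold stumps can be sketched as below. This is a sketch under stated assumptions, not the Viola–Jones implementation: labels are taken to be in {-1, +1}, and all function names are made up for the example.

```python
import math

# AdaBoost with one-feature threshold stumps as the weak learners.
# Each stump is (feature f, threshold t, sign s): predict s if row[f] <= t, else -s.

def stump_predict(f, t, s, row):
    return s if row[f] <= t else -s

def fit_weighted_stump(X, y, w):
    """Return the stump minimizing weighted training error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            for s in (1, -1):
                err = sum(wi for row, yi, wi in zip(X, y, w)
                          if stump_predict(f, t, s, row) != yi)
                if err < best_err:
                    best_err, best = err, (f, t, s)
    return best, best_err

def adaboost(X, y, rounds=5):
    n = len(y)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        (f, t, s), err = fit_weighted_stump(X, y, w)
        err = max(err, 1e-12)                    # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)  # stump's vote weight
        ensemble.append((alpha, f, t, s))
        # Upweight misclassified points, downweight correct ones, renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(f, t, s, row))
             for wi, yi, row in zip(w, y, X)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, row):
    score = sum(a * stump_predict(f, t, s, row) for a, f, t, s in ensemble)
    return 1 if score >= 0 else -1
```

The final prediction is a weighted vote of the stumps, which is how boosting turns many weak one-feature tests into a stronger combined classifier.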
The term "decision stump" was coined in a 1992 ICML paper by Wayne Iba and Pat Langley.〔Iba, Wayne; and Langley, Pat (1992); ''Induction of One-Level Decision Trees'', in ''ML92: Proceedings of the Ninth International Conference on Machine Learning, Aberdeen, Scotland, 1–3 July 1992'', San Francisco, CA: Morgan Kaufmann, pp. 233–240〕〔Oliver, Jonathan J.; and Hand, David (1994); ''Averaging Over Decision Stumps'', in ''Machine Learning: ECML-94, European Conference on Machine Learning, Catania, Italy, April 6–8, 1994, Proceedings'', Lecture Notes in Computer Science (LNCS) 784, Springer, pp. 231–241, ISBN 3-540-57868-4. Quote: "These simple rules are in effect severely pruned decision trees and have been termed ''decision stumps'' (Iba and Langley)".〕